Predictive Entropy Search for Bayesian Optimization with Unknown Constraints: Supplementary Material
Authors
Abstract
PESC computes a Gaussian approximation to the NFCPD (main text, Eq. (11)) using Expectation Propagation (EP) (Minka, 2001). EP is a method for approximating a product of factors (often a single prior factor and multiple likelihood factors) with a tractable distribution, for example a Gaussian. EP generates a Gaussian approximation by approximating each individual factor with a Gaussian. The product of all of these Gaussians results in a single Gaussian distribution that approximates the product of all the exact factors. This is in contrast to the Laplace approximation, which fits a single Gaussian distribution to the whole posterior. EP can be intuitively understood as fitting the individual Gaussian approximations by minimizing the Kullback-Leibler (KL) divergences between each exact factor and its corresponding Gaussian approximation. This corresponds to matching first and second moments between exact and approximate factors. However, EP does this moment matching in the context of all the other approximate factors, since we are ultimately interested in having a good approximation in regions where the overall posterior probability is high. Concretely, assume we wish to approximate the distribution...
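To make the cavity/moment-matching idea concrete, the following is a minimal one-dimensional EP sketch in Python. It is an illustration under assumed settings, not the PESC implementation: the target is a Gaussian prior multiplied by soft step-function (probit) factors, and the function name ep_gaussian_times_probits, the target distribution, and all parameter choices are hypothetical.

```python
# Minimal 1-D Expectation Propagation sketch (illustrative assumption, not the PESC code):
# approximate p(x) ∝ N(x; mu0, v0) * prod_i Phi((x - c_i) / s)
# by replacing each non-Gaussian factor with a Gaussian "site" and moment-matching
# the tilted distribution formed with the cavity (all other approximate factors).

import numpy as np
from scipy.stats import norm

def ep_gaussian_times_probits(mu0, v0, c, s=1.0, n_iters=50):
    c = np.asarray(c, dtype=float)
    n = len(c)
    # Site parameters in natural form: precisions tau_i and precision-means nu_i.
    tau = np.zeros(n)
    nu = np.zeros(n)
    # Global approximation q(x) = N(x; m, v), initialised to the prior.
    v, m = v0, mu0
    for _ in range(n_iters):
        for i in range(n):
            # 1. Cavity: remove site i from q (subtract its natural parameters).
            v_cav = 1.0 / (1.0 / v - tau[i])
            m_cav = v_cav * (m / v - nu[i])
            # 2. First and second moments of the tilted distribution
            #    N(x; m_cav, v_cav) * Phi((x - c_i)/s), available in closed form.
            denom = np.sqrt(v_cav + s ** 2)
            z = (m_cav - c[i]) / denom
            ratio = norm.pdf(z) / norm.cdf(z)
            m_tilt = m_cav + v_cav * ratio / denom
            v_tilt = v_cav - v_cav ** 2 * ratio * (z + ratio) / (v_cav + s ** 2)
            # 3. New site: the Gaussian that makes cavity * site match the tilted moments.
            tau[i] = 1.0 / v_tilt - 1.0 / v_cav
            nu[i] = m_tilt / v_tilt - m_cav / v_cav
            # 4. Refresh the global approximation from the prior plus all sites.
            v = 1.0 / (1.0 / v0 + tau.sum())
            m = v * (mu0 / v0 + nu.sum())
    return m, v

if __name__ == "__main__":
    # Example: standard normal prior softly truncated by factors at 0 and 0.5.
    m, v = ep_gaussian_times_probits(mu0=0.0, v0=1.0, c=[0.0, 0.5], s=0.1)
    print("EP approximation: mean = %.3f, variance = %.3f" % (m, v))
```

Each pass removes one approximate factor to form the cavity, matches the first and second moments of the resulting tilted distribution, and re-derives that factor's Gaussian site, exactly the "moment matching in the context of all the other approximate factors" described above; with a single factor the procedure is exact, and with several factors it converges to a Gaussian that is accurate where the overall posterior mass lies.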
Similar papers
Predictive Entropy Search for Bayesian Optimization with Unknown Constraints
Unknown constraints arise in many types of expensive black-box optimization problems. Several methods have been proposed recently for performing Bayesian optimization with constraints, based on the expected improvement (EI) heuristic. However, EI can lead to pathologies when used with constraints. For example, in the case of decoupled constraints—i.e., when one can independently evaluate the ob...
Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints
This work presents PESMOC, Predictive Entropy Search for Multi-objective Bayesian Optimization with Constraints, an information-based strategy for the simultaneous optimization of multiple expensive-to-evaluate black-box functions in the presence of several constraints. PESMOC can hence be used to solve a wide range of optimization problems. Iteratively, PESMOC chooses an input location on w...
Lookahead Bayesian Optimization with Inequality Constraints
We consider the task of optimizing an objective function subject to inequality constraints when both the objective and the constraints are expensive to evaluate. Bayesian optimization (BO) is a popular way to tackle optimization problems with expensive objective function evaluations, but has mostly been applied to unconstrained problems. Several BO approaches have been proposed to address expen...
Max-value Entropy Search for Efficient Bayesian Optimization
Entropy Search (ES) and Predictive Entropy Search (PES) are popular and empirically successful Bayesian Optimization techniques. Both rely on a compelling information-theoretic motivation, and maximize the information gained about the arg max of the unknown function; yet, both are plagued by the expensive computation for estimating entropies. We propose a new criterion, Max-value Entropy Search...
Predictive Entropy Search for Efficient Global Optimization of Black-box Functions
We propose a novel information-theoretic approach for Bayesian optimization called Predictive Entropy Search (PES). At each iteration, PES selects the next evaluation point that maximizes the expected information gained with respect to the global maximum. PES codifies this intractable acquisition function in terms of the expected reduction in the differential entropy of the predictive distribut...